Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- Lawmakers have started to regulate “dark patterns,” understood to be design practices meant to influence technology users’ decisions through manipulative or deceptive means. Most agree that dark patterns are undesirable, but open questions remain as to which design choices should be subjected to scrutiny, much less the best way to regulate them. In this Article, we propose adapting the concept of dark patterns to better fit legal frameworks. Critics allege that the legal conceptualizations of dark patterns are overbroad, impractical, and counterproductive. We argue that law and policy conceptualizations of dark patterns suffer from three deficiencies: First, dark patterns lack a clear value anchor for cases to build upon. Second, legal definitions of dark patterns overfocus on individuals and atomistic choices, ignoring de minimis aggregate harms and the societal implications of manipulation at scale. Finally, the law has struggled to articulate workable legal thresholds for wrongful dark patterns. To better regulate the designs called dark patterns, lawmakers need a better conceptual framing that bridges the gap between design theory and the law’s need for clarity, flexibility, and compatibility with existing frameworks. We argue that wrongful self-dealing is at the heart of what most consider to be “dark” about certain design patterns. Taking advantage of design affordances to the detriment of a vulnerable party is disloyal. To that end, we propose disloyal design as a regulatory framing for dark patterns. In drawing from established frameworks that prohibit wrongful self-dealing, we hope to provide more clarity and consistency for regulators, industry, and users. Disloyal design will fit better into legal frameworks and better rally public support for ensuring that the most popular tools in society are built to prioritize human values. Free, publicly-accessible full text available June 30, 2026.
- Free, publicly-accessible full text available April 25, 2026.
- In recent years, gig work platforms have gained popularity as a way for individuals to earn money; as of 2021, 16% of Americans have at some point earned money from such platforms. Despite their popularity and their history of unfair data collection practices and worker safety issues, little is known about the data collected from workers (and users) by gig platforms and about the privacy dark pattern designs present in their apps. This paper presents an empirical measurement of 16 gig work platforms' data practices in the U.S. We analyze what data is collected by these platforms, and how it is shared and used. Finally, we consider how these practices constitute privacy dark patterns. To that end, we develop a novel combination of methods to address gig-worker-specific challenges in experimentation and data collection, enabling the largest in-depth study of such platforms to date. We find extensive data collection and sharing with 60 third parties—including sharing reversible hashes of worker Social Security Numbers (SSNs)—along with dark patterns that subject workers to greater privacy risk and opportunistically use collected data to nag workers in off-platform messages. We conclude this paper with proposed interdisciplinary mitigations for improving gig worker privacy protections. After we disclosed our SSN-related findings to affected platforms, the platforms confirmed that the issue had been mitigated. This is consistent with our independent audit of the affected platforms. Analysis code and redacted datasets will be made available to those who wish to reproduce our findings. Free, publicly-accessible full text available January 1, 2026.
- Privacy law is failing to protect individuals from being watched and exposed, despite stronger surveillance and data protection rules. The problem is that our rules look to social norms to set thresholds for privacy violations, but people can get used to being observed. In this Article, we argue that by ignoring de minimis privacy encroachments, the law is complicit in normalizing surveillance. Privacy law helps acclimate people to being watched by ignoring smaller, more frequent, and more mundane privacy diminutions. We call these reductions “privacy nicks,” like the proverbial “thousand cuts” that lead to death. Privacy nicks come from the proliferation of cameras and biometric sensors on doorbells, glasses, and watches, and the drift of surveillance and data analytics into new areas of our lives like travel, exercise, and social gatherings. Under our theory of privacy nicks as the Achilles heel of surveillance law, invasive practices become routine through repeated exposures that acclimate us to being vulnerable and watched in increasingly intimate ways. With acclimation comes resignation, and this shift in attitude biases how citizens and lawmakers view reasonable measures and fair tradeoffs. Because the law looks to norms and people’s expectations to set thresholds for what counts as a privacy violation, the normalization of these nicks results in a constant renegotiation of privacy standards to society’s disadvantage. When this happens, the legal and social threshold for rejecting invasive new practices keeps getting redrawn, excusing ever more aggressive intrusions. In effect, the test of what privacy law allows is whatever people will tolerate. There is no rule to stop us from tolerating everything. This Article provides a new theory and terminology to understand where privacy law falls short and suggests a way to escape the current surveillance spiral.
- Deceptive, manipulative, and coercive practices are deeply embedded in our digital experiences, impacting our ability to make informed choices and undermining our agency and autonomy. These design practices—collectively known as “dark patterns” or “deceptive patterns”—are increasingly under legal scrutiny and sanctions, largely due to the efforts of human-computer interaction scholars who have conducted pioneering research relating to dark pattern types, definitions, and harms. In this workshop, we continue building this scholarly community with a focus on organizing for action. Our aims include: (i) building capacity around specific research questions relating to methodologies for detection; (ii) characterizing harms; and (iii) creating effective countermeasures. Through the outcomes of the workshop, we will connect our scholarship to the legal, design, and regulatory communities to inform further legislative and legal action.
- Internet-of-Things (IoT) devices are ubiquitous, but little attention has been paid to how they may incorporate dark patterns, despite the consumer protection and privacy concerns arising from their unique access to intimate spaces and their always-on capabilities. This paper conducts a systematic investigation of dark patterns in 57 popular, diverse smart home devices. We update manual interaction and annotation methods for the IoT context, then analyze dark pattern frequency across device types, manufacturers, and interaction modalities. We find that dark patterns are pervasive in IoT experiences, but manifest in diverse ways across device traits. Speakers, doorbells, and camera devices contain the most dark patterns, and the manufacturers of such devices (Amazon and Google) have more dark patterns than other vendors. We investigate how this distribution impacts the potential for consumer exposure to dark patterns, discuss broader implications for key stakeholders like designers and regulators, and identify opportunities for future dark patterns study.
- Technology ethics is increasingly at the forefront of human-computer interaction scholarship, with growing visibility not only to end users of technology, but also to regulators, technology practitioners, and platforms. The notion of “dark patterns” has emerged as one common framing of technology manipulation, describing instances where psychological or perceptual tricks are used to decrease user agency and autonomy. In this panel, we have assembled a group of highly diverse early-career scholars who have built a transdisciplinary approach to scholarship on dark patterns, engaging with a range of socio-technical approaches and perspectives. Panelists will discuss their methodological approaches, key research questions to be considered in this emerging area of scholarship, and necessary connections between and among disciplinary perspectives to engage with the diverse constituencies that frame the creation, use, and impacts of dark patterns.
- Growth hacking, particularly within the spectre of surveillance capitalism, has led to the widespread use of deceptive, manipulative, and coercive design techniques in the last decade. These challenges exist at the intersection of many different technology professions that are rapidly evolving and “shapeshifting” their design practices to confront emerging regulation. A wide range of scholars have increasingly addressed these challenges through the label “dark patterns,” describing the content of deceptive and coercive design practices, the ubiquity of these patterns in contemporary digital systems, and the impact of emerging regulatory and legislative action on the presence of dark patterns. Building on this convergent and trans-disciplinary research area, the aims of this SIG are to: 1) provide an opportunity for researchers and practitioners to address methodologies for detecting, characterizing, and regulating dark patterns; 2) identify opportunities for additional empirical work to characterize and demonstrate harms related to dark patterns; and 3) aid in convergence among HCI, design, computational, regulatory, and legal perspectives on dark patterns. These goals will enable an internationally-diverse, engaged, and impactful research community to address the threats of dark patterns on digital systems.
- Deceptive design patterns (sometimes called “dark patterns”) are user interface design elements that may trick, deceive, or mislead users into behaviors that often benefit the party implementing the design over the end user. Prior work has taxonomized, investigated, and measured the prevalence of such patterns primarily in visual user interfaces (e.g., on websites). However, as the ubiquity of voice assistants and other voice-assisted technologies increases, we must anticipate how deceptive designs will be (and indeed, are already) deployed in voice interactions. This paper makes two contributions towards characterizing and surfacing deceptive design patterns in voice interfaces. First, we make a conceptual contribution, identifying key characteristics of voice interfaces that may enable deceptive design patterns, and surfacing existing and theoretical examples of such patterns. Second, we present the findings from a scenario-based user survey with 93 participants, in which we investigate participants’ perceptions of voice interfaces that we consider to be both deceptive and non-deceptive.
- Dark patterns are user interface elements that can influence a person's behavior against their intentions or best interests. Prior work identified these patterns in websites and mobile apps, but little is known about how the design of platforms might impact dark pattern manifestations and related human vulnerabilities. In this paper, we conduct a comparative study of mobile application, mobile browser, and web browser versions of 105 popular services to investigate variations in dark patterns across modalities. We perform manual tests, identify dark patterns in each service, and examine how they persist or differ by modality. Our findings show that while services can employ some dark patterns equally across modalities, many dark patterns vary between platforms, and that these differences saddle people with inconsistent experiences of autonomy, privacy, and control. We conclude by discussing broader implications for policymakers and practitioners, and provide suggestions for furthering dark patterns research.
An official website of the United States government